Nonmanual
Fundamentals of formal properties of nonmanuals:
A quantitative approach
About the project
The project “Fundamentals of formal properties of nonmanuals: A quantitative approach” (NONMANUAL) is funded by the European Research Council (link to the project in the ERC Datahub). It will run from January 2023 to December 2027.
If you are interested in collaborating on topics related to this project, do not hesitate to email me.
Summary
Sign languages, in addition to using the hands, also use positions and movements of other articulators (the body, the head, the mouth, the eyebrows, the eyes, and the eyelids) to convey lexical, grammatical, and prosodic information. This linguistic use of the nonmanual articulators is known as nonmanuals. Contrary to current assumptions in the field of sign linguistics, this project proposes the hypothesis that all sign languages use the same basic universal building blocks (nonmanual movements), but that each language differs in how it combines these building blocks both sequentially and simultaneously; languages also differ in the regularity, frequency, and alignment properties of the nonmanuals.
In order to test this hypothesis, the project will investigate formal properties of nonmanuals in five geographically, historically, and socially diverse sign languages, using data from published naturalistic corpora of these sign languages, Computer Vision for extracting measurements of the movement of nonmanual articulators, and the statistical techniques of Non-linear Mixed Effects Modelling and Functional Data Analysis for a quantitative comparison of dynamic nonmanual contours. This will result in the first quantitative formal typology of nonmanuals grounded in naturalistic corpus data. The novel methodology proposed in this project requires testing, adjustment, and development, which constitutes an important component of the project. The developed methodological pipeline will be a secondary output, enabling large-scale, reliable quantitative research on nonmanuals in the future.
Finally, the established typology of formal properties of nonmanuals in the five sign languages will serve as basis for a cross-modal comparison between nonmanuals and prosody/intonation in spoken languages in order to separate truly universal features of the human linguistic capacity from the effects of the visual vs. auditory modalities.
The team
- PI: Vadim Kimmelman
- Postdoctoral researcher: Allah Bux
- PhD student: Margaux Susman
- PhD student: Lorena Figueiredo
- Statistician: Jan Bulla
- Research Assistants: Laurence Crettenand
Collaborations with Anna Kuznetsova, Marloes Oomen, Roland Pfau, Connie de Vos, Josefina Safar, Ari Price, Anastasiia Chizhikova, Anara Sandygulova, Medet Mukushev, Alfarabi Imashev, Anželika Teresė.
Project publications
Kimmelman, V., M. Oomen & R. Pfau (2024). Headshakes in NGT: Relation between Phonetic Properties & Linguistic Functions. Proceedings of LREC-SL 2024. [pdf]
- We use OpenFace to measure head rotation during headshakes expressing negation in NGT (Sign Language of the Netherlands). We find that some of the phonetic/kinetic measures of headshakes correlate with their linguistic functions.
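For readers curious about what such measurements look like in practice, here is a minimal sketch of reading the CSV produced by OpenFace's FeatureExtraction and summarizing head yaw (pose_Ry) over an annotated headshake span. The file name, the time span, and the kinetic descriptors are illustrative assumptions, not the exact analysis in the paper.

```python
# Illustrative sketch: summarizing OpenFace head yaw over an annotated headshake span.
# The file name and the annotation span are hypothetical.
import numpy as np
import pandas as pd

df = pd.read_csv("ngt_clip_openface.csv")   # output of OpenFace FeatureExtraction
df.columns = df.columns.str.strip()         # OpenFace column names may carry leading spaces
df = df[df["success"] == 1]                 # keep frames where face tracking succeeded

# Head rotation in radians: pose_Rx = pitch, pose_Ry = yaw, pose_Rz = roll.
# A headshake appears as an oscillation in yaw.
span = df[(df["timestamp"] >= 2.4) & (df["timestamp"] <= 3.1)]   # hypothetical headshake span
yaw = span["pose_Ry"].to_numpy()

amplitude = yaw.max() - yaw.min()                      # overall yaw amplitude
turns = np.sum(np.diff(np.sign(np.diff(yaw))) != 0)    # number of direction changes
print(f"yaw amplitude: {amplitude:.3f} rad, direction changes: {turns}")
```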
Kimmelman, V., A. Price, J. Safar, C. de Vos & J. Bulla (2024) Nonmanual Marking of Questions in Balinese Homesign Interactions: a Computer-Vision Assisted Analysis. Proceedings of LREC-SL 2024. [pdf]
- We look at nonmanual marking of questions in five deaf homesigners from Bali. It turns out that polar questions and non-polar questions are marked by opposite directions of head movement, and this is consistent across homesigners. The analysis is based on a combination of manual annotation and extracting measurements of head tilt (pitch) with OpenFace.
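As an illustration of how manual annotations and OpenFace pitch measurements can be combined, the sketch below averages pose_Rx over hypothetical question spans and compares question types. The annotation table, its column names, and the values are invented for the example and do not reproduce the paper's analysis.

```python
# Minimal sketch: mean head pitch (pose_Rx) per annotated question span, grouped by type.
import pandas as pd

pose = pd.read_csv("homesign_clip_openface.csv")   # hypothetical OpenFace output
pose.columns = pose.columns.str.strip()

# Hypothetical annotation export (e.g. from ELAN): one row per question,
# with start/end in seconds and a question-type label.
anno = pd.DataFrame({
    "start": [1.2, 5.8, 9.0],
    "end":   [2.0, 6.9, 10.1],
    "type":  ["polar", "content", "polar"],
})

def mean_pitch(row):
    frames = pose[(pose["timestamp"] >= row["start"]) & (pose["timestamp"] <= row["end"])]
    return frames["pose_Rx"].mean()   # pose_Rx = pitch; the sign indicates tilt direction

anno["mean_pitch"] = anno.apply(mean_pitch, axis=1)
print(anno.groupby("type")["mean_pitch"].mean())
```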
Susman, M. & V. Kimmelman (2024). Eye Blink Detection in Sign Language Data Using CNNs and Rule-Based Methods. Proceedings of LREC-SL 2024. [pdf]
- Eye blinks are important prosodic markers across sign languages. However, cross-linguistic research on these markers is almost non-existent. In order to enable cross-linguistic comparison, we develop and test two methods of automatic detection of eye blinks. Both methods produce promising results.
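The sketch below illustrates the rule-based idea in its simplest form: flag a blink whenever a per-frame eye-aperture signal (for instance, an eye aspect ratio computed from facial landmarks) stays below a threshold for a minimum number of frames. The threshold and minimum duration are placeholder values, and this is not necessarily the exact rule set used in the paper.

```python
# Minimal rule-based blink detector: below-threshold runs in an eye-aperture signal.
import numpy as np

def detect_blinks(ear, threshold=0.2, min_frames=2):
    """Return (start, end) frame indices of below-threshold runs in an EAR signal."""
    closed = ear < threshold
    blinks, start = [], None
    for i, c in enumerate(closed):
        if c and start is None:
            start = i
        elif not c and start is not None:
            if i - start >= min_frames:
                blinks.append((start, i - 1))
            start = None
    if start is not None and len(closed) - start >= min_frames:
        blinks.append((start, len(closed) - 1))
    return blinks

# Usage with a synthetic signal: open eyes (~0.3) with two brief closures
ear = np.array([0.31, 0.30, 0.12, 0.08, 0.11, 0.29, 0.30, 0.10, 0.09, 0.28])
print(detect_blinks(ear))   # [(2, 4), (7, 8)]
```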
Kuznetsova, A. & V. Kimmelman. (2024). Testing MediaPipe Holistic for Linguistic Analysis of Nonmanual Markers in Sign Languages. arXiv. http://arxiv.org/abs/2403.10367. (open access)
- We tested whether the MediaPipe 3D reconstructed model of the face is distorted in the presence of head tilts (and compared it to the OpenFace 3D reconstructed model). It turns out that both models are substantially distorted in the presence of vertical head tilts (pitch), which makes it impossible to use eyebrow distance measurements from these models without additional corrections.
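For context, the sketch below shows how a raw per-frame eyebrow-to-eyelid distance can be extracted with MediaPipe Face Mesh. The video path and the landmark indices (105 for the left eyebrow, 159 for the left upper eyelid, following commonly used Face Mesh indexings) are assumptions to verify, and, as the paper shows, such distances must additionally be corrected for head pitch before they can be interpreted.

```python
# Sketch: per-frame eyebrow-to-eyelid distance from MediaPipe Face Mesh (legacy solutions API).
import cv2
import mediapipe as mp

distances = []
with mp.solutions.face_mesh.FaceMesh(static_image_mode=False) as face_mesh:
    cap = cv2.VideoCapture("clip.mp4")          # hypothetical video file
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            brow, lid = lm[105], lm[159]        # assumed indices: left eyebrow, left upper eyelid
            # Vertical distance in normalized image coordinates (y grows downwards)
            distances.append(lid.y - brow.y)
        else:
            distances.append(float("nan"))
    cap.release()

print(f"extracted {len(distances)} frames of eyebrow-eyelid distance")
```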
Kimmelman, V. & A. Teresė (2023). Analyzing Literary Texts in Lithuanian Sign Language with Computer Vision: A Proof of Concept. In R. Galimullin & S. Touileb (eds.), CEUR Workshop Proceedings, vol. 3413. https://ceur-ws.org/Vol-3431/paper5.pdf (open access)
- In this study, we demonstrate how Computer Vision (MediaPipe) can be applied to analyze kinetic properties in literary pieces and their non-literary retellings in Lithuanian Sign Language. For example, the graph below shows that the eyebrows move more in the originals in comparison to the retellings of the same pieces.
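One simple way to quantify "how much the eyebrows move" in a clip is to sum the frame-to-frame displacement of an eyebrow signal, such as the distance extracted in the previous sketch. The measure and the file names below are illustrative and not necessarily those used in the study.

```python
# Sketch: total frame-to-frame displacement of an eyebrow signal per clip.
import numpy as np

def movement(signal):
    signal = np.asarray(signal, dtype=float)
    signal = signal[~np.isnan(signal)]          # drop frames without a detected face
    return np.sum(np.abs(np.diff(signal)))

original = np.loadtxt("original_brow_distance.txt")      # hypothetical per-frame signals
retelling = np.loadtxt("retelling_brow_distance.txt")
print("original:", movement(original), "retelling:", movement(retelling))
```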
Previous publications
Kimmelman, V., A. Imashev, M. Mukushev & A. Sandygulova. (2020). Eyebrow position in grammatical and emotional expressions in Kazakh-Russian Sign Language: A quantitative study. PLOS ONE 15(6). https://doi.org/10.1371/journal.pone.0233731 (open access)
- In this study, we applied a Computer Vision tool (OpenPose) to quantitatively analyze eyebrow position as affected by three sentence types and three different emotions in utterances produced by ten signers. See a video below demonstrating application of the tool and sentence types.
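OpenPose writes one JSON file per frame; the sketch below reads such a file and extracts eyebrow keypoints from the 70-point face model. The file name is hypothetical, and the eyebrow indices (17-26, following the dlib 68-point scheme that the face model extends) should be checked against the OpenPose documentation.

```python
# Sketch: reading eyebrow keypoints from an OpenPose per-frame JSON file.
import json
import numpy as np

with open("frame_000000_keypoints.json") as f:   # hypothetical OpenPose output file
    data = json.load(f)

face = np.array(data["people"][0]["face_keypoints_2d"]).reshape(-1, 3)  # x, y, confidence
brows = face[17:27]                      # assumed indices for both eyebrows
reliable = brows[brows[:, 2] > 0.3]      # drop low-confidence points
print("mean eyebrow height (pixels):", reliable[:, 1].mean())
```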
Kuznetsova, A., A. Imashev, M. Mukushev, A. Sandygulova & V. Kimmelman (2021). Using Computer Vision to Analyze Non-manual Marking of Questions in KRSL. In D. Shterionov (ed.) Proceedings of the 1st International Workshop on Automatic Translation for Signed and Spoken Languages (AT4SSL), (pp. 49-59). Association for Machine Translation in the Americas. https://aclanthology.org/2021.mtsummit-at4ssl.6/ (open access)
- In this study, we re-analyzed parts of the data from the previous study using a different CV tool (OpenFace) and applying Machine Learning to improve the measurements of eyebrow movements.
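As a toy illustration of what "improving the measurements" can mean, the sketch below regresses an eyebrow-distance signal on head rotation over frames annotated as having neutral eyebrows and subtracts the pose-predicted component. This is only one possible correction, not the model used in the paper, and all file and column names are invented.

```python
# Toy sketch: removing the head-pose-related component from an eyebrow-distance signal.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("krsl_frames.csv")              # hypothetical per-frame pose, distance, annotation
neutral = df[df["brow_annotation"] == "neutral"]  # frames with neutral eyebrows only

features = ["pose_Rx", "pose_Ry", "pose_Rz"]
model = LinearRegression().fit(neutral[features], neutral["brow_distance"])

# Residual after removing the pose-predicted component
df["brow_corrected"] = df["brow_distance"] - model.predict(df[features])
```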
Kuznetsova, A., Imashev, A., Mukushev, M., Sandygulova, A., & Kimmelman, V. (2022). Functional Data Analysis of Non-manual Marking of Questions in Kazakh-Russian Sign Language. In E. Efthimiou, S.-E. Fotinea, T. Hanke, J. A. Hochgesang, J. Kristoffersen, J. Mesch, & M. Schulder (Eds.), Proceedings of the LREC2022 10th Workshop on the Representation and Processing of Sign Languages: Multilingual Sign Language Resources (pp. 124-131). European Language Resources Association (ELRA). https://www.sign-lang.uni-hamburg.de/lrec/pub/22024.pdf
- In this study, we improved the statistical analysis of the data from the previous study in order to account for the dynamic nature of eyebrow and head movements. The figure below shows the application of FDA to head movement and to inner and outer eyebrow movements across different sentence types, with and without landmark registration (time alignment to sign boundaries).
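The core idea of landmark registration can be illustrated without the full FDA machinery: each contour is rescaled so that its sign boundaries land on a common grid, after which mean contours per sentence type can be compared. The sketch below uses synthetic data and simple linear warping, a simplification of the registration used in the paper.

```python
# Sketch: linear time-warping of contours to a common grid before averaging.
import numpy as np

def register(contour, onset, offset, n_points=100):
    """Resample contour[onset:offset] onto a fixed-length grid (simple linear warping)."""
    segment = np.asarray(contour[onset:offset + 1], dtype=float)
    old_grid = np.linspace(0, 1, len(segment))
    new_grid = np.linspace(0, 1, n_points)
    return np.interp(new_grid, old_grid, segment)

# Synthetic example: two contours of different lengths from the same sentence type
c1 = np.sin(np.linspace(0, np.pi, 80))
c2 = np.sin(np.linspace(0, np.pi, 120))
registered = np.vstack([register(c1, 10, 70), register(c2, 15, 110)])
mean_contour = registered.mean(axis=0)    # functional mean for this sentence type
```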
Chizhikova, A., & Kimmelman, V. (2022). Phonetics of Negative Headshake in Russian Sign Language: A Small-Scale Corpus Study. In E. Efthimiou, S.-E. Fotinea, T. Hanke, J. A. Hochgesang, J. Kristoffersen, J. Mesch, & M. Schulder (Eds.), Proceedings of the LREC2022 10th Workshop on the Representation and Processing of Sign Languages: Multilingual Sign Language Resources (pp. 29-36). European Language Resources Association (ELRA). https://www.sign-lang.uni-hamburg.de/lrec/pub/22011.pdf
- In this study, we analyzed phonetic properties of headshake expressing negation in Russian Sign Language using OpenFace. Example of a headshake measured as head rotation in OpenFace: